Dynamic Classifier Selection by Adaptive k-Nearest-Neighbourhood Rule
Authors
Abstract
Despite the good results provided by Dynamic Classifier Selection (DCS) mechanisms based on local accuracy in a large number of applications, there is still room for performance improvement. As the selection is performed by computing the accuracy of each classifier in a neighbourhood of the test pattern, performance depends on the shape and size of that neighbourhood, as well as on the local density of the patterns. In this paper, we investigate the use of neighbourhoods of adaptive shape and size to better cope with the difficulties of reliably estimating local accuracies. Reported results show that performance improvements can be achieved by suitably tuning a few additional parameters.
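For readers unfamiliar with DCS by local accuracy, the sketch below illustrates the basic selection step the abstract refers to: rank a pool of trained classifiers by their accuracy on the validation patterns falling in a k-nearest-neighbour region of the test pattern, and let the locally best one label it. This is a minimal illustration assuming scikit-learn-style estimators and a plain, fixed-size neighbourhood; the adaptive shaping and sizing studied in the paper, and the helper name dcs_local_accuracy, are not taken from the source.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def dcs_local_accuracy(classifiers, X_val, y_val, x_test, k=7):
    """Select the classifier with the highest accuracy in the
    k-nearest-neighbour region of x_test and use it to label x_test.
    Illustrative sketch only; the paper's adaptive reshaping of the
    neighbourhood is not reproduced here."""
    # Neighbourhood of the test pattern, taken from a validation set.
    nn = NearestNeighbors(n_neighbors=k).fit(X_val)
    _, idx = nn.kneighbors(x_test.reshape(1, -1))
    region_X, region_y = X_val[idx[0]], y_val[idx[0]]

    # Local accuracy of each (already fitted) classifier on the
    # validation patterns falling inside that neighbourhood.
    local_acc = [clf.score(region_X, region_y) for clf in classifiers]
    best = int(np.argmax(local_acc))
    return classifiers[best].predict(x_test.reshape(1, -1))[0]

# Hypothetical usage:
#   pool = [tree.fit(X_tr, y_tr), svm.fit(X_tr, y_tr)]
#   y_hat = dcs_local_accuracy(pool, X_val, y_val, x_test, k=7)
```

In the paper's setting, the fixed kneighbors query above would be replaced by a neighbourhood whose shape and size adapt to the local density of the patterns.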
Similar Resources
Learning Vector Quantization With Alternative Distance Criteria
An adaptive algorithm for training a Nearest Neighbour (NN) classifier is developed in this paper. The learning rule is similar to the well-known LVQ method, but uses the nearest centroid neighbourhood concept to estimate optimal locations of the codebook vectors. The aim of this approach is to improve the performance of the standard LVQ algorithms when using a very small code...
RIONA: A Classifier Combining Rule Induction and k-NN Method with Automated Selection of Optimal Neighbourhood
The article describes a method combining two widely-used empirical approaches: rule induction and instance-based learning. In our algorithm (RIONA), the decision is predicted not on the basis of the whole support set of all rules matching a test case, but on the support set restricted to a neighbourhood of the test case. The size of the optimal neighbourhood is automatically induced during the learning p...
Feature Extraction Using Genetic Algorithms
This paper summarizes our research on feature selection and extraction from high-dimensional data sets using genetic algorithms. We have developed a GA-based approach utilizing a feedback linkage between feature evaluation and classification. That is, we carry out feature selection or feature extraction simultaneously with classifier design, through “genetic learning and evolution.” This app...
Asymmetric Neighbourhood Selection and Support Aggregation for Effective Classification
The k-Nearest-Neighbours (kNN) rule is a simple but effective method for classification. Its success depends on the selection of a “good value” for k. To reduce the bias of k and to take account of the different roles or influences that features play with respect to the decision attribute, we propose a novel asymmetric neighbourhood selection and support aggregation method in t...
The Pair-wise Linear Classifier and the k-NN Rule in Application to ALS Progression Differentiation
Two kinds of classifier based on the k-NN rule, the standard and the parallel version, were used for recognition of the severity of ALS disease. In the case of the second classifier version, feature selection was done separately for each pair of classes. The error rate, estimated by the leave-one-out method, was used as a criterion both for determining the optimum values of k and for featu...
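The last two entries above both hinge on choosing a “good value” of k, with the leave-one-out error rate serving as the selection criterion in the ALS study. The fragment below is a minimal sketch of that generic criterion, assuming scikit-learn; the function name best_k_by_loo and the candidate grid are illustrative, and the asymmetric neighbourhoods and pair-wise feature selection of the cited papers are not reproduced.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

def best_k_by_loo(X, y, candidate_ks=range(1, 16, 2)):
    """Return the k with the lowest leave-one-out error rate
    (illustrative sketch of the generic criterion only)."""
    loo = LeaveOneOut()
    errors = {}
    for k in candidate_ks:
        # Each leave-one-out fold scores a single held-out pattern,
        # so the mean accuracy over folds gives the LOO accuracy.
        scores = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                                 X, y, cv=loo)
        errors[k] = 1.0 - scores.mean()
    return min(errors, key=errors.get)
```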